# Legal Text Pretraining
## Legalbert Large 1.7M 2
A RoBERTa model pretrained on English legal and administrative texts, specializing in language understanding tasks in the legal domain.
Large Language Model · Transformers · English
pile-of-law · 701 · 63
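
The Pile of Law checkpoints are listed as Transformers-compatible, so they can be loaded with the standard `AutoTokenizer`/`AutoModel` classes. Below is a minimal sketch of masked-token prediction with this checkpoint; the repo id `pile-of-law/legalbert-large-1.7M-2` is an assumption inferred from the model name and organization shown above, not something stated in the listing.

```python
# Minimal sketch: masked-token prediction with an assumed repo id.
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

model_id = "pile-of-law/legalbert-large-1.7M-2"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill in a masked token in a legal sentence.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
text = f"The court granted the defendant's {tokenizer.mask_token} to dismiss."
for prediction in fill_mask(text, top_k=3):
    print(prediction["token_str"], round(prediction["score"], 3))
```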
## Legalbert Large 1.7M 1
A BERT large model pretrained on English legal and administrative texts using RoBERTa pretraining objectives.
Large Language Model · Transformers · English
pile-of-law · 120 · 14
## Custom Legalbert
A BERT model optimized for the legal domain, pretrained from scratch on 37 GB of legal ruling texts.
Large Language Model · English
casehold · 12.59k · 12
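
An encoder like this is commonly used as a feature extractor for downstream legal NLP tasks. The sketch below shows one way to pull sentence-level features from the checkpoint; the repo id `casehold/custom-legalbert` is assumed from the model name and organization above.

```python
# Minimal sketch: sentence-level feature extraction with an assumed repo id.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "casehold/custom-legalbert"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

text = "The defendant moved for summary judgment under Rule 56."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Use the final hidden state of the [CLS] token as a simple sentence embedding.
cls_embedding = outputs.last_hidden_state[:, 0, :]
print(cls_embedding.shape)  # torch.Size([1, hidden_size])
```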